
Objectivity

Characteristic Name: Objectivity
Dimension: Reliability and Credibility
Description: Data are unbiased and impartial
Granularity: Information object
Implementation Type: Process-based approach
Characteristic Type: Usage

Verification Metric:

The number of tasks failed or underperformed due to biased or partial data
The number of complaints received due to biased or partial data


The implementation guidelines describe how to address the characteristic; the scenarios are examples of how each guideline can be applied.

Guideline: Identify all the factors that make particular data/information biased for the intended use, and take preventive actions to eliminate them.
Scenario: (1) A written questionnaire is better than a face-to-face interview for collecting sensitive personal data.

Guideline: Design and execute preventive actions for all possible information distortions (malfunctions or personal biases) that may be caused by information/data collectors.
Scenario: (1) Perform a dual-coder approach to code qualitative data (see the sketch after this list).

Guideline: Design and execute preventive actions for all possible information distortions (malfunctions or personal biases) that may be caused by information/data transmitters.
Scenario: (1) After a survey is performed, each participant is contacted individually by a party other than the person who conducted the survey, who randomly verifies that the participant's real responses have been recorded properly.
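The dual-coder scenario above can be operationalised by measuring how often the two coders agree, for example with Cohen's kappa; persistently low agreement is a signal of coder bias or an ambiguous coding scheme. The Python sketch below is illustrative only: the category labels, sample codings, and the 0.6 threshold are assumptions, not part of the guideline.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equally long sequences of labels."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Agreement expected by chance, given each coder's own label frequencies.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codings of ten interview answers by two independent coders.
coder_1 = ["pos", "neg", "pos", "neu", "pos", "neg", "neu", "pos", "neg", "pos"]
coder_2 = ["pos", "neg", "neu", "neu", "pos", "neg", "neu", "pos", "pos", "pos"]

kappa = cohens_kappa(coder_1, coder_2)
print(f"Cohen's kappa = {kappa:.2f}")
if kappa < 0.6:  # threshold chosen for illustration only
    print("Low inter-coder agreement: review the coding scheme for bias.")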

Validation Metric:

How mature is the process for preventing biased or partial data

These are examples of how the characteristic might occur in a database.

Example: Source:
Consider an inventory database that contains part numbers, warehouse locations, quantity on hand, and other information. However, it does not contain source information (where the parts came from). If a part is supplied by multiple suppliers, once the parts are received and put on the shelf there is no indication of which supplier the parts came from. The information in the database is always accurate and current. For normal inventory transactions and decision making, the database is certainly of high quality. If a supplier reports that one of their shipments contained defective parts, this database is of no help in identifying whether they have any of those parts or not. The database is of poor quality because it does not contain a relevant element of information. Without that information, the database is of poor data quality for the intended use. J. E. Olson, “Data Quality: The Accuracy Dimension”, Morgan Kaufmann Publishers, 9 January 2003.

The Definitions are examples of the characteristic that appear in the sources provided.

Definition: Source:
The degree to which Information is presented without bias, enabling the Knowledge Worker to understand the meaning and significance without misinterpretation. ENGLISH, L. P. 2009. Information quality applied: Best practices for improving business information, processes and systems, Wiley Publishing.
Is the information free of distortion, bias, or error? EPPLER, M. J. 2006. Managing information quality: increasing the value of information in knowledge-intensive products and processes, Springer.
1) Data are unbiased and impartial

2) Objectivity is the extent to which data are unbiased (unprejudiced) and impartial.

WANG, R. Y. & STRONG, D. M. 1996. Beyond accuracy: What data quality means to data consumers. Journal of management information systems, 5-33.

 

Data timeliness

Characteristic Name: Data timeliness
Dimension: Currency
Description: Data that refer to time should be available for use within an acceptable time relative to their time of creation
Granularity: Record
Implementation Type: Process-based approach
Characteristic Type: Usage

Verification Metric:

The number of tasks failed or underperformed due to lack of data timeliness
The number of complaints received due to lack of data timeliness


The implementation guidelines describe how to address the characteristic; the scenarios are examples of how each guideline can be applied.

Guideline: Recognise the activity/event that generates the time-sensitive attribute values and specify rules for generating those values.
Scenario: (1) Efficiency of a production line:
1) The line-out quality check signifies the end of manufacturing of a product in a lean manufacturing line.
2) The number of products that pass the line-out quality checks in a given time period is the efficiency of the line.

Guideline: Specify the valid time period within which the attribute values must be recorded.
Scenario: (1) The growth of the bacteria should be measured after 15 hours of culturing. (2) Efficiency should be calculated and recorded once every 10 minutes, starting from the 10th minute of the hour (six times per hour).

Guideline: Specify the valid time period during which the attribute values may be used (a rule of this kind is sketched after this list).
Scenario: (1) The exchange rate for the day is valid from 8 am until 8 am the following day.
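A minimal sketch of how the last guideline's validity-period rule might be checked in code, using the exchange-rate scenario; the function names and the fixed 8 am boundary are assumptions made for illustration rather than a prescribed implementation.

from datetime import datetime, timedelta

VALIDITY = timedelta(hours=24)  # assumed: a daily rate is valid for 24 hours from 8 am

def rate_valid_from(recorded_at: datetime) -> datetime:
    """Start of the validity window: the 8 am boundary preceding the recording time."""
    boundary = recorded_at.replace(hour=8, minute=0, second=0, microsecond=0)
    return boundary if recorded_at >= boundary else boundary - timedelta(days=1)

def is_timely(recorded_at: datetime, used_at: datetime) -> bool:
    """True if the rate is still inside its valid time period when it is used."""
    start = rate_valid_from(recorded_at)
    return start <= used_at < start + VALIDITY

# A rate recorded at 08:05 is usable the same afternoon but stale at 09:00 the next day.
recorded = datetime(2024, 3, 4, 8, 5)
print(is_timely(recorded, datetime(2024, 3, 4, 15, 0)))  # True
print(is_timely(recorded, datetime(2024, 3, 5, 9, 0)))   # False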

Validation Metric:

How mature is the process of creating and implementing the DQ rules that handle data timeliness

These are examples of how the characteristic might occur in a database.

Example: Source:
Stable data such as birth dates have volatility equal to 0, as they do not vary at all. Conversely, stock quotes, a kind of frequently changing data, have a high degree of volatility because they remain valid for only very short time intervals. C. Batini and M. Scannapieco, “Data Quality: Concepts, Methodologies, and Techniques”, Springer, 2006.
the quotation of a stock remains valid for only a few seconds irrespective of architectural choices C. Cappiello, C. Francalanci, and B. Pernici, “Time-Related Factors of Data Quality in Multichannel Information System” in Journal of Management Information Systems, Vol. 20, No. 3, M.E. Sharpe, Inc., 2004, pp.71-91.
For example, patient census is needed daily to provide sufficient day-to-day operations staffing, such as nursing and food service. However, annual or monthly patient census data are needed for the facility's strategic planning. B. Cassidy, et al., “Practice Brief: Data Quality Management Model” in Journal of AHIMA, 1998, 69(6).
Consider a system where each user must change their own password every 6 months. Passwords that have not been updated for more than 6 months are not valid in the system and can be treated as absolutely stale elements (a minimal sketch of this rule appears after these examples). O. Chayka, T. Palpanas, and P. Bouquet, “Defining and Measuring Data-Driven Quality Dimension of Staleness”, Trento: University of Trento, Technical Report # DISI-12-016, 2012.
Consider a database containing sales information for a division of a company. This database contains three years’ worth of data. However, the database is slow to become complete at the end of each month. Some units submit their information immediately, whereas others take several days to send in information. There are also a number of corrections and adjustments that flow in. Thus, for a period of time at the end of the accounting period, the content is incomplete. However, all of the data is correct when complete. If this database is to be used to compute sales bonuses that are due on the 15th of the following month, it is of poor data quality even though the data in it is always eventually accurate. The data is not timely enough for the intended use. However, if this database is to be used for historical trend analysis and to make decisions on altering territories, it is of excellent data quality as long as the user knows when all additions and changes are incorporated. Waiting for all of the data to get in is not a problem because its intended use is to make long-term decisions. J. E. Olson, “Data Quality: The Accuracy Dimension”, Morgan Kaufmann Publishers, 9 January 2003.
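The password-staleness example from Chayka et al. above translates directly into a simple record-level timeliness check; the accounts, the reference date, and the 183-day approximation of 6 months below are hypothetical values used only to illustrate the rule.

from datetime import datetime, timedelta

MAX_AGE = timedelta(days=183)  # assumed approximation of the 6-month policy

def is_stale(last_changed: datetime, as_of: datetime) -> bool:
    """True if the password has not been updated within the allowed period."""
    return as_of - last_changed > MAX_AGE

# Hypothetical last-change dates for two accounts.
last_changed = {
    "alice": datetime(2024, 1, 10),
    "bob": datetime(2023, 5, 2),
}
as_of = datetime(2024, 6, 1)
stale_accounts = [u for u, changed in last_changed.items() if is_stale(changed, as_of)]
print(stale_accounts)  # ['bob']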

The Definitions are examples of the characteristic that appear in the sources provided.

Definition: Source:
A measure of the degree to which data are current and available for use as specified and in the time frame in which they are expected. D. McGilvray, “Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information”, Morgan Kaufmann Publishers, 2008.
Domain Level: The data element represents the most current information resulting from the output of a business event. Entity Level: The entity represents the most current information resulting from the output of a business event. B. BYRNE, J. K., D. MCCARTY, G. SAUTER, H. SMITH, P. WORCESTER 2008. The information perspective of SOA design Part 6: The value of applying the data quality analysis pattern in SOA. IBM Corporation.
The “age” of the data is correct for the Knowledge Worker’s purpose. Purposes such as inventory control for Just-in-Time Inventory require the most current data. Comparing sales trends for the last period to the period one year ago requires sales data from the respective periods. ENGLISH, L. P. 2009. Information quality applied: Best practices for improving business information, processes and systems, Wiley Publishing.
Determines the extent to which data is sufficiently up-to-date for the task at hand. For example, hats, mittens, and scarves are in stock by November. G. GATLING, C. B., R. CHAMPLIN, H. STEFANI, G. WEIGEL 2007. Enterprise Information Management with SAP, Boston, Galileo Press Inc.
Timeliness of data refers to the extent to which data is collected within a reasonable time period from the activity or event and is available within a reasonable timeframe to be used for whatever purpose it is intended. Data should be made available at whatever frequency and within whatever timeframe is needed to support decision making. HIQA 2011. International Review of Data Quality Health Information and Quality Authority (HIQA), Ireland. http://www.hiqa.ie/press-release/2011-04-28-international-review-data-quality.
The currency (age) of the data is appropriate to its use. PRICE, R. J. & SHANKS, G. Empirical refinement of a semiotic information quality framework. System Sciences, 2005. HICSS'05. Proceedings of the 38th Annual Hawaii International Conference on, 2005. IEEE, 216a-216a.
Timeliness can be defined in terms of currency (how recent data are). SCANNAPIECO, M. & CATARCI, T. 2002. Data quality under a computer science perspective. Archivi & Computer, 2, 1-15.
1) The age of an information object.

2) The amount of time the information remains valid in the context of a particular activity.

STVILIA, B., GASSER, L., TWIDALE, M. B. & SMITH, L. C. 2007. A framework for information quality assessment. Journal of the American Society for Information Science and Technology, 58, 1720-1733.
The age of the data is appropriate for the task at hand. WANG, R. Y. & STRONG, D. M. 1996. Beyond accuracy: What data quality means to data consumers. Journal of management information systems, 5-33.